Optimal kernel choice for domain adaption learning
Authors
Abstract
In this paper, a kernel choice method is proposed for domain adaptation, referred to as Optimal Kernel Choice Domain Adaption (OKCDA). It jointly learns a robust classifier and the parameters associated with Multiple Kernel Learning. Kernel-based domain adaptation strategies have shown outstanding performance. They embed two domains of different distributions, namely the auxiliary (source) and the target domains, into a Hilbert space, and exploit the labeled data from the source domain to train a robust kernel-based SVM classifier for the target domain. We reduce the distribution mismatch by constructing a test statistic between the two domains based on the Maximum Mean Discrepancy (MMD), and minimize the Type II error subject to an upper bound on the Type I error. Simultaneously, we minimize the structural risk functional. To highlight the advantages of the proposed method, we tackle text classification problems on the 20 Newsgroups and Email Spam datasets. The results demonstrate that our method exhibits outstanding performance.
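The MMD statistic at the core of this approach measures the distance between the mean embeddings of the two domains in the kernel's Hilbert space. Below is a minimal sketch of a standard biased MMD² estimator with a single Gaussian kernel — not the paper's multi-kernel optimization; the function names and the fixed `sigma` are illustrative assumptions:

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and the rows of Y."""
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-sq / (2.0 * sigma**2))

def mmd2(X, Y, sigma=1.0):
    """Biased estimate of the squared Maximum Mean Discrepancy between
    samples X (source domain) and Y (target domain)."""
    Kxx = gaussian_kernel(X, X, sigma)
    Kyy = gaussian_kernel(Y, Y, sigma)
    Kxy = gaussian_kernel(X, Y, sigma)
    return Kxx.mean() + Kyy.mean() - 2.0 * Kxy.mean()
```

When the two samples come from the same distribution this statistic is close to zero, and it grows as the distributions drift apart — which is what makes it usable as a test statistic for distribution mismatch.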
Similar references
Feature Selection of Support Vector Domain Description Using Gaussian Kernel
The performance of the kernel-based learning algorithms, such as support vector domain description, depends heavily on the proper choice of the kernel parameter. It is desirable for the kernel machines to work on the optimal kernel parameter that adapts well to the input data and the pattern classification tasks. In this paper we present a novel algorithm to optimize the Gaussian kernel paramet...
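The snippet above is truncated, so its specific algorithm is not recoverable here. A widely used baseline for choosing the Gaussian kernel width, however, is the median heuristic: set the bandwidth to the median pairwise distance in the data. A minimal numpy sketch (the function name is an illustrative assumption, not this paper's method):

```python
import numpy as np

def median_heuristic_sigma(X):
    """Set the Gaussian kernel width sigma to the median pairwise
    Euclidean distance between the input points (a common baseline)."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    d = np.sqrt(np.maximum(d2, 0.0))                   # guard against tiny negative values
    return np.median(d[np.triu_indices_from(d, k=1)])  # off-diagonal distances only
```

This heuristic adapts the kernel scale to the spread of the input data, which is the same motivation the snippet gives for tuning the kernel parameter.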
SVM Model Selection for Microarray Classification
Support vector machines (SVMs) [4] are gaining broad acceptance as state-of-the-art classifiers for microarray data analysis [3]. However, most studies that use SVMs to predict sample class consider only a small subset of SVM kernels and parameters. The effect of the kernel type and parameter values is usually not studied in microarray classification. The choice of kernel and classifier paramet...
Deep Visual Domain Adaptation: A Survey
Deep domain adaptation has emerged as a new learning technique to address the lack of massive amounts of labeled data. Compared to conventional methods, which learn shared feature subspaces or reuse important source instances with shallow representations, deep domain adaptation methods leverage deep networks to learn more transferable representations by embedding domain adaptation in the pipeline o...
Tracking Approximate Solutions of Parameterized Optimization Problems over Multi-Dimensional (Hyper-)Parameter Domains
Many machine learning methods are given as parameterized optimization problems. Important examples of such parameters are regularization and kernel hyperparameters. These parameters have to be tuned carefully, since the choice of their values can have a significant impact on the statistical performance of the learning methods. In most cases the parameter space does not carry much structure and pa...
Advancing Bayesian Optimization: The Mixed-Global-Local (MGL) Kernel and Length-Scale Cool Down
Bayesian Optimization (BO) has become a core method for solving expensive black-box optimization problems. While much research has focused on the choice of the acquisition function, we focus on online length-scale adaptation and the choice of kernel function. Instead of choosing hyperparameters in view of maximum likelihood on past data, we propose to use the acquisition function to decide on hyperp...
Journal:
Eng. Appl. of AI
Volume 51, Issue -
Pages -
Published 2016